
openai-experimental[patch]: Adds experimental raw response field to OpenAI chat models #6087

Merged · 4 commits · Jul 17, 2024

Conversation

jacoblee93 (Collaborator) commented Jul 16, 2024

Passing __includeRawResponse: true on instantiation will return the raw response from OpenAI in additional_kwargs.

Streaming is not (and probably never will be) exactly 1:1 with OpenAI: we append a single extra metadata chunk at the end of the stream rather than yielding the raw payload whenever OpenAI emits it. The raw response is also not meaningful on the final result of chunk concatenation, since merging multiple OpenAI streaming responses makes no sense.
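The streaming behavior described above can be sketched in a minimal, self-contained way. This is an illustration, not the PR's actual code: the `MessageChunk` shape and the `__raw_response` field name are assumptions made for the example. With the real models, one would pass `__includeRawResponse: true` at instantiation and read the raw payload from `additional_kwargs`.

```typescript
// Hypothetical sketch: the raw OpenAI payload arrives as one extra metadata
// chunk at the END of the stream, rather than alongside each content chunk.
interface MessageChunk {
  content: string;
  additional_kwargs: Record<string, unknown>;
}

// Concatenate streamed chunks: content is joined, additional_kwargs merged.
function concatChunks(chunks: MessageChunk[]): MessageChunk {
  return chunks.reduce(
    (acc, chunk) => ({
      content: acc.content + chunk.content,
      additional_kwargs: { ...acc.additional_kwargs, ...chunk.additional_kwargs },
    }),
    { content: "", additional_kwargs: {} }
  );
}

const streamed: MessageChunk[] = [
  { content: "Hello", additional_kwargs: {} },
  { content: ", world", additional_kwargs: {} },
  // Final metadata chunk: carries the raw response once, at the end.
  { content: "", additional_kwargs: { __raw_response: { id: "chatcmpl-abc123" } } },
];

const merged = concatChunks(streamed);
console.log(merged.content);
console.log("__raw_response" in merged.additional_kwargs);
```

Note that concatenating chunks from two different streams would merge two unrelated raw responses, which is why, as the description says, the raw field on a concatenated result is not meaningful.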

vercel bot commented Jul 16, 2024

The latest updates on your projects (Updated Jul 17, 2024 2:29am UTC):
langchainjs-api-refs ✅ Ready
langchainjs-docs ✅ Ready

@dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files) and auto:improvement (Medium size change to existing code to handle new use-cases) labels on Jul 16, 2024
@@ -28,6 +28,24 @@ import { AzureChatOpenAI } from "../../azure/chat_models.js";
// Save the original value of the 'LANGCHAIN_CALLBACKS_BACKGROUND' environment variable

Hey team, I've flagged the added beforeAll block in the PR for review as it explicitly sets environment variables using process.env. This change should be reviewed to ensure it aligns with our environment variable management practices.

@@ -5,6 +5,24 @@ import { AIMessageChunk } from "@langchain/core/messages";
import { AzureChatOpenAI } from "../../azure/chat_models.js";

Hey team, I've flagged the recent change in the PR for review as it explicitly accesses and sets environment variables using process.env. Please take a look and ensure it aligns with our best practices for handling environment variables. Thanks!

@@ -1,6 +1,24 @@
import { test, expect } from "@jest/globals";

Hey there! I noticed that the recent PR includes changes to access and set environment variables using process.env. This comment is to flag the change for maintainers to review the handling of environment variables in the code. Great work on the PR!

@@ -15,6 +15,24 @@ import { AzureOpenAI } from "../../azure/llms.js";
// Save the original value of the 'LANGCHAIN_CALLBACKS_BACKGROUND' environment variable

Hey team, just a heads up that I've flagged the recent changes in the llms.int.test.ts file for review. The added beforeAll block sets environment variables based on corresponding TEST_ prefixed variables, so it's important to ensure everything is handled correctly. Let me know if you have any questions!
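The bot comments above all flag the same test-setup pattern: a `beforeAll` block that copies `TEST_`-prefixed environment variables onto their unprefixed names before integration tests run. A hedged sketch of that pattern is below; the helper name and prefix handling are assumptions for illustration, and the PR's actual `beforeAll` blocks may differ.

```typescript
// Hypothetical sketch: copy every TEST_FOO entry in `env` to FOO, returning
// the names that were set. In a Jest beforeAll this would be called as
// applyTestEnv(process.env).
const TEST_PREFIX = "TEST_";

function applyTestEnv(env: Record<string, string | undefined>): string[] {
  const applied: string[] = [];
  for (const [key, value] of Object.entries(env)) {
    if (key.startsWith(TEST_PREFIX) && value !== undefined) {
      const target = key.slice(TEST_PREFIX.length);
      env[target] = value;
      applied.push(target);
    }
  }
  return applied;
}

// Demonstration against a fake environment object (no real process.env mutation).
const fakeEnv: Record<string, string | undefined> = {
  TEST_AZURE_OPENAI_API_KEY: "secret",
  UNRELATED: "x",
};
const applied = applyTestEnv(fakeEnv);
console.log(applied);
console.log(fakeEnv.AZURE_OPENAI_API_KEY);
```

Because this mutates process-wide state, it is exactly the kind of change the bot asks maintainers to review against their environment-variable handling practices.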
